Results 1 - 6 of 6
1.
Q J Nucl Med Mol Imaging ; 67(3): 183-190, 2023 Sep.
Article in English | MEDLINE | ID: mdl-37646239

ABSTRACT

Guidelines for bone scintigraphy are well established and recommend the use of planar early phase images to investigate a number of clinical indications. With recent advances in gamma camera technology, the use of SPECT/CT imaging in the early phases is now possible, offering the potential of improved diagnostic confidence and prognostic value. To date, little work has been carried out to optimize the acquisition of early phase bone images using SPECT/CT, with most of the available studies acquiring SPECT images after the traditional planar images to allow comparison of the two techniques. Imaging durations of 7 to 10 minutes have commonly been used; however, iterative reconstruction algorithms have been investigated with rapid SPECT imaging to allow imaging durations as low as 4 minutes. The use of CZT-based systems, with increased sensitivity and improved energy and spatial resolution, also offers the potential to reduce imaging times. The optimization of projection measurement order has been investigated as a method of reducing image artefacts caused by changing tracer distribution during the SPECT acquisition. In this article we consider the current state of early phase SPECT imaging and possible areas for future investigation, as well as recommendations for departments looking to adopt blood pool SPECT imaging as part of their routine clinical practice.


Subjects
Single Photon Emission Computed Tomography Computed Tomography , Single-Photon Emission Computed Tomography , Humans , X-Ray Computed Tomography , Algorithms , Artifacts
2.
Nucl Med Commun ; 35(11): 1096-106, 2014 Nov.
Article in English | MEDLINE | ID: mdl-25144565

ABSTRACT

INTRODUCTION: An audit was carried out into UK glomerular filtration rate (GFR) calculation. The results were compared with an identical 2001 audit. METHODS: Participants used their routine method to calculate GFR for 20 data sets (four plasma samples each) in millilitres per minute, and also the GFR normalized for body surface area. Some unsound data sets were included to analyse the applied quality control (QC) methods. Variability between centres was assessed for each data set, compared with the national median and with a reference value calculated using the method recommended in the British Nuclear Medicine Society guidelines. The influence of the number of samples on variability was studied. Supplementary data were requested on workload and methodology. RESULTS: The 59 returns showed widespread standardization. The applied early exponential clearance correction was the main contributor to the observed variability. These corrections were applied by 97% of centres (50% in the 2001 audit), with 80% using the recommended averaged Brøchner-Mortensen correction. Approximately 75% applied the recommended Haycock body surface area formula for adults (78% for children). The effect of the number of samples used was not significant. There was wide variability in the applied QC techniques, especially in terms of the use of the volume of distribution. CONCLUSION: The widespread adoption of the guidelines has harmonized national GFR calculation compared with the previous audit. Further standardization could reduce variability still further. This audit has highlighted the need to address the national standardization of QC methods. Radionuclide techniques are confirmed as the preferred method for GFR measurement when an unequivocal result is required.
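The calculation pipeline audited above (terminal-exponential slope-intercept fit, early-exponential correction, body-surface-area normalization) can be sketched in a few lines. This is a generic illustration using the published adult Brøchner-Mortensen coefficients and the Haycock formula named in the abstract, not the audit's reference implementation; the function names are invented for illustration.

```python
import math

def slope_intercept_gfr(times_min, concentrations, injected_activity):
    """Uncorrected GFR (ml/min) from the terminal exponential of a
    multi-sample plasma clearance curve: fit ln(c) = ln(c0) - k*t by
    least squares, then GFR = injected_activity * k / c0."""
    n = len(times_min)
    ly = [math.log(c) for c in concentrations]
    sx, sy = sum(times_min), sum(ly)
    sxx = sum(x * x for x in times_min)
    sxy = sum(x * y for x, y in zip(times_min, ly))
    slope = (n * sxy - sx * sy) / (n * sxx - sx * sx)
    intercept = (sy - slope * sx) / n
    k = -slope                # elimination rate constant (1/min)
    c0 = math.exp(intercept)  # back-extrapolated concentration at t=0
    return injected_activity * k / c0

def brochner_mortensen(gfr_uncorrected):
    """Published adult correction for the neglected early exponential."""
    return 0.990778 * gfr_uncorrected - 0.001218 * gfr_uncorrected ** 2

def haycock_bsa(height_cm, weight_kg):
    """Haycock body surface area (m^2)."""
    return 0.024265 * height_cm ** 0.3964 * weight_kg ** 0.5378

def normalised_gfr(gfr, height_cm, weight_kg):
    """GFR scaled to the standard 1.73 m^2 body surface area."""
    return gfr * 1.73 / haycock_bsa(height_cm, weight_kg)
```

For example, an uncorrected slope-intercept GFR of 100 ml/min corrects to roughly 86.9 ml/min under the Brøchner-Mortensen formula.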


Subjects
Clinical Audit , Glomerular Filtration Rate , Kidney Function Tests/methods , Plasma/metabolism , Adult , Child , Female , Humans , Male , Middle Aged , Practice Guidelines as Topic , Quality Control , United Kingdom
3.
Nucl Med Commun ; 35(5): 511-21, 2014 May.
Article in English | MEDLINE | ID: mdl-24448215

ABSTRACT

INTRODUCTION: The Nuclear Medicine Software Quality Group of the Institute of Physics and Engineering in Medicine has conducted a multicentre, multivendor audit to evaluate the use of resolution recovery software from several manufacturers when applied to myocardial perfusion data with half the normal counts, acquired under a variety of clinical protocols in a range of departments. The objective was to determine whether centres could obtain clinical results with half-count data processed with resolution recovery software that were equivalent to those obtained using their normal protocols. MATERIALS AND METHODS: Sixteen centres selected 50 routine myocardial perfusion studies each, from which the Nuclear Medicine Software Quality Group generated simulated half-count studies using Poisson resampling. These half-count studies were reconstructed using resolution recovery, and the clinical reports were compared with the original reports from the full-count data. A total of 769 patient studies were processed and compared. RESULTS: Eight centres found only a small number of clinically relevant discrepancies between the two reports, whereas eight had an unacceptably high number of discrepancies. There were no significant differences in acquisition parameters between the two groups, although centres finding a high number of discrepancies were more likely to perform both rest and stress scans on normal studies. CONCLUSION: Half of the participating centres could potentially make use of resolution recovery to reduce the administered activity for myocardial perfusion scans without changing their routine acquisition protocols. The other half could consider adjusting the reconstruction parameters used with their resolution recovery software if they wish to use reduced activity successfully.
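The abstract does not specify how the Poisson resampling was implemented; a common way to simulate a half-count acquisition from measured projection data is binomial thinning, sketched below (function name invented for illustration). Thinning each recorded count independently with probability 0.5 turns a Poisson variable into a Poisson variable with half the mean, so the result has the statistics of a genuine half-duration (or half-activity) acquisition.

```python
import numpy as np

rng = np.random.default_rng(seed=0)

def half_count(projections):
    """Simulate a half-count acquisition by binomial thinning: each
    recorded count is kept independently with probability 0.5."""
    return rng.binomial(projections.astype(np.int64), 0.5)

# Example: a synthetic 64x64 projection with a mean of 100 counts/pixel.
full = rng.poisson(100.0, size=(64, 64))
half = half_count(full)
```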


Subjects
Computer-Assisted Image Processing , Medical Audit , Myocardial Perfusion Imaging , Software , Humans , Stroke Volume
4.
Med Phys ; 40(8): 082506, 2013 Aug.
Article in English | MEDLINE | ID: mdl-23927351

ABSTRACT

PURPOSE: Attenuation correction is essential for reliable interpretation of emission tomography; however, the use of transmission measurements to generate attenuation maps is limited by the availability of equipment and by potential mismatches between the transmission and emission measurements. The authors present a first step toward a method of estimating an attenuation map from measured scatter data without a transmission scan. METHODS: A scatter model has been developed that accurately predicts the distribution of photons that have been scattered once. The scatter model has been used as the basis of a maximum likelihood gradient ascent method to estimate an attenuation map from measured scatter data. In order to estimate both the attenuation map and the activity distribution, iterations of the derived scatter-based algorithm have been alternated with the maximum likelihood expectation maximization algorithm in a joint estimation process. For each iteration of the attenuation map estimation, the activity distribution is fixed at the values estimated during the previous activity iteration, and in each iteration of the activity distribution estimation the attenuation map is fixed at the values estimated during the previous attenuation iteration. The use of photopeak data to enhance the estimation of the attenuation map, compared with the use of scatter data alone, has also been considered. The derived algorithm has been used to reconstruct data simulated for an idealized two-dimensional situation and data acquired from a physical phantom. RESULTS: The reconstruction of idealized data demonstrated good recovery of both the activity distribution and the attenuation map. The inclusion of information recorded in the photopeak energy window in the attenuation map estimation step improved the accuracy of the reconstruction, enabling an accurate attenuation map to be recovered. Validation with physical phantom data demonstrated that different regions of attenuation could be distinguished in a real situation and produced results that represent a promising first step toward the use of scatter data to estimate the activity distribution and attenuation map from single photon emission computed tomography (SPECT) data without a transmission scan. CONCLUSIONS: The technique presented shows promise as a method of attenuation correction for SPECT data without the need for a separate transmission scan. Further work is required to improve the method and to compensate for the assumptions used in the scatter model.
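The activity-update half of the alternating scheme is, per the abstract, the standard maximum likelihood expectation maximization (MLEM) step. A minimal sketch of one such update, assuming the fixed attenuation map has been folded into a dense system matrix (names invented for illustration; a practical SPECT reconstruction would use an on-the-fly projector, not a stored matrix):

```python
import numpy as np

def mlem_update(activity, system_matrix, measured):
    """One MLEM iteration with the attenuation map held fixed:
        lambda <- lambda * A^T(y / (A lambda)) / (A^T 1)
    where A is the (attenuated) system matrix and y the measured counts."""
    forward = system_matrix @ activity             # expected projection counts
    ratio = measured / np.maximum(forward, 1e-12)  # guard against divide-by-zero
    sensitivity = system_matrix.T @ np.ones_like(measured)
    return activity * (system_matrix.T @ ratio) / np.maximum(sensitivity, 1e-12)
```

In the joint estimation described above, calls to an update like this would alternate with the scatter-based gradient-ascent update of the attenuation map, each step holding the other unknown fixed.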


Subjects
Computer-Assisted Image Processing/methods , Radiation Scattering , Single-Photon Emission Computed Tomography/methods , Algorithms , Likelihood Functions , Reproducibility of Results
5.
Nucl Med Commun ; 34(10): 990-1004, 2013 Oct.
Article in English | MEDLINE | ID: mdl-23880898

ABSTRACT

PURPOSE: The aim of the study was to evaluate UK-wide interinstitutional reproducibility of left-ventricular functional parameters, end-systolic volume, end-diastolic volume and ejection fraction, obtained from gated myocardial perfusion imaging (GMPI) studies using technetium-99m-labelled radiopharmaceuticals. The study was carried out by the UK Institute of Physics and Engineering in Medicine Nuclear Medicine Software Quality Group. MATERIALS AND METHODS: Ten anonymized clinical GMPI studies, five with normal perfusion and five with perfusion defects, were made available in DICOM and proprietary formats for download and through manufacturers' representatives. Two of the studies were duplicated in order to assess intraoperator repeatability, giving a total of 12 studies. Studies were made available in 8 and 16 frames/cycle. RESULTS: A total of 58 institutions across England, Scotland, Wales and Northern Ireland participated in this study using six different computer packages. Studies were processed at centres using their normal clinical computers and software. The overall mean±SD ejection fraction for all centres was 58.5±3%; the mean end-diastolic volume was 114±12 ml and the mean end-systolic volume was 54±6 ml. The results were affected by the number of frames per cycle and by the postprocessing computer package, but not by the reconstruction filter in the filtered back-projection. CONCLUSION: Calculation of functional parameters from GMPI using technetium-99m-labelled radiopharmaceuticals is reliable and shows limited variability across the UK.
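The three functional parameters compared in the study are linked by the standard definition of ejection fraction; for reference:

```python
def ejection_fraction(edv_ml, esv_ml):
    """Left-ventricular ejection fraction (%) from gated end-diastolic
    and end-systolic volumes: EF = 100 * (EDV - ESV) / EDV."""
    return 100.0 * (edv_ml - esv_ml) / edv_ml
```

Note that applying this to the reported mean volumes, (114 − 54)/114 ≈ 52.6%, does not reproduce the reported mean ejection fraction of 58.5%; the mean of the per-study ratios is not the ratio of the mean volumes.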


Subjects
Cardiac-Gated Imaging Techniques/standards , Medical Audit , Myocardial Perfusion Imaging/standards , Left Ventricular Function , Aged , Humans , Computer-Assisted Image Processing , Middle Aged , Observer Variation , United Kingdom
6.
Nucl Med Commun ; 34(8): 796-805, 2013 Aug.
Article in English | MEDLINE | ID: mdl-23660761

ABSTRACT

AIM: The Nuclear Medicine Software Quality Group of the Institute of Physics and Engineering in Medicine has conducted an audit to compare the ways in which different manufacturers implement the filters used in single-photon emission computed tomography. The aim of the audit was to identify differences between manufacturers' implementations of the same filter and to find means for converting parameters between systems. METHODS: Computer-generated data representing projection images of an ideal test object were processed using seven different commercial nuclear medicine systems. Images were reconstructed using filtered back projection and a Butterworth filter with three different cutoff frequencies and three different orders. RESULTS: The audit found large variations between the frequency-response curves of what were ostensibly the same filters on different systems. The differences were greater than could be explained simply by different Butterworth formulae. Measured cutoff frequencies varied between 40% and 180% of those expected. There was also occasional confusion with respect to frequency units. CONCLUSION: The audit concluded that the practical implementation of filtering, such as the size of the kernel, has a profound effect on the results, producing large differences between systems. Nevertheless, this work shows how users can quantify the frequency response of their own systems, making it possible to compare two systems in order to find filter parameters on each that produce equivalent results. These findings will also make it easier for users to replicate filters similar to those in published results, even if they are using a different computer system.
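One well-known source of the kind of discrepancy found above is that vendors publish the Butterworth filter in more than one algebraic form. The sketch below compares the two commonly published variants and locates where each actually crosses the -3 dB point; this is a generic illustration, not the audit's measurement method, and the helper names are invented.

```python
import math

def butterworth(f, cutoff, order):
    """Amplitude response in the most common nuclear-medicine form:
    1 / sqrt(1 + (f / cutoff)^(2 * order)); equals 1/sqrt(2) at f = cutoff."""
    return 1.0 / math.sqrt(1.0 + (f / cutoff) ** (2 * order))

def butterworth_no_sqrt(f, cutoff, order):
    """Variant without the square root, also seen in the literature;
    it falls to 0.5 (not 1/sqrt(2)) at the nominal cutoff."""
    return 1.0 / (1.0 + (f / cutoff) ** (2 * order))

def measured_cutoff(response_fn, cutoff, order, target=1 / math.sqrt(2)):
    """Bisect on 0..10*cutoff for the frequency where a monotonically
    decreasing response formula crosses the -3 dB target."""
    lo, hi = 0.0, 10.0 * cutoff
    for _ in range(60):
        mid = 0.5 * (lo + hi)
        if response_fn(mid, cutoff, order) > target:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)
```

For a nominal cutoff of 0.5 cycles/pixel at order 5, the square-root form crosses -3 dB exactly at 0.5, while the no-square-root form crosses below it, so two systems quoting "the same" cutoff can apply measurably different smoothing.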


Subjects
Computer-Assisted Image Processing/standards , Single-Photon Emission Computed Tomography/standards , Quality Control , Software